Accelerated Sparse Recovery via Gradient Descent with Nonlinear Conjugate Gradient Momentum

Authors

Abstract

This paper applies an idea of adaptive momentum for the nonlinear conjugate gradient to accelerate optimization problems in sparse recovery. Specifically, we consider two types of minimization problems: minimizing a (single) differentiable function and minimizing the sum of a differentiable function and a non-smooth function. In the first case, we adopt a fixed step size to avoid the traditional line search and establish the convergence analysis of the proposed algorithm for a quadratic problem. The acceleration is further incorporated with an operator splitting technique to deal with the second case. We use the convex $$\ell_1$$ and the nonconvex $$\ell_1-\ell_2$$ functionals as two case studies to demonstrate the efficiency of the proposed approaches over traditional methods.
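As a concrete illustration of the setting described in the abstract, the sketch below (our own reading, not the authors' code) combines a fixed-step gradient step with a nonlinear conjugate gradient momentum direction and a proximal (operator splitting) step for the $$\ell_1$$ term. The Polak-Ribiere-style momentum coefficient, the fixed step size, and the iteration count are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of gradient descent with
# nonlinear conjugate gradient momentum plus operator splitting for
#     min_x  0.5 * ||A x - b||^2 + lam * ||x||_1 .
import numpy as np

def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1 (soft thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def cg_momentum_l1(A, b, lam, step, n_iter=500):
    x = np.zeros(A.shape[1])
    grad = A.T @ (A @ x - b)        # gradient of the smooth least-squares term
    direction = -grad               # initial search direction
    for _ in range(n_iter):
        # Forward step along the CG-type direction with a fixed step size,
        # then a backward (proximal) step on the non-smooth l1 term.
        x_new = soft_threshold(x + step * direction, step * lam)
        grad_new = A.T @ (A @ x_new - b)
        # Polak-Ribiere-type momentum coefficient, clipped at zero (an
        # illustrative choice, not necessarily the paper's rule).
        beta = max(0.0, grad_new @ (grad_new - grad) / (grad @ grad + 1e-12))
        direction = -grad_new + beta * direction
        x, grad = x_new, grad_new
    return x
```

With the momentum coefficient fixed to zero, the iteration reduces to the classical ISTA scheme, which makes the contribution of the conjugate gradient momentum easy to isolate in experiments.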


Similar resources

Accelerated Gradient Descent Escapes Saddle Points Faster than Gradient Descent

Nesterov's accelerated gradient descent (AGD), an instance of the general family of "momentum methods", provably achieves a faster convergence rate than gradient descent (GD) in the convex setting. However, whether these methods are superior to GD in the nonconvex setting remains open. This paper studies a simple variant of AGD, and shows that it escapes saddle points and finds a second-order stat...
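For readers unfamiliar with the baseline, a minimal sketch of the classical Nesterov AGD update is given below. The variant analyzed in that paper additionally exploits negative curvature to escape saddle points, which is not shown here, and the momentum schedule $$k/(k+3)$$ is an illustrative convex-setting choice.

```python
# Minimal sketch of the classical Nesterov accelerated gradient descent update.
import numpy as np

def nesterov_agd(grad_f, x0, step, n_iter=1000):
    x = x_prev = np.asarray(x0, dtype=float)
    for k in range(n_iter):
        y = x + (k / (k + 3.0)) * (x - x_prev)   # look-ahead (momentum) point
        x_prev, x = x, y - step * grad_f(y)      # gradient step at the look-ahead
    return x
```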


Asynchronous Accelerated Stochastic Gradient Descent

Stochastic gradient descent (SGD) is a widely used optimization algorithm in machine learning. In order to accelerate the convergence of SGD, a few advanced techniques have been developed in recent years, including variance reduction, stochastic coordinate sampling, and Nesterov’s acceleration method. Furthermore, in order to improve the training speed and/or leverage larger-scale training data...


A New Descent Nonlinear Conjugate Gradient Method for Unconstrained Optimization

In this paper, a new nonlinear conjugate gradient method is proposed for large-scale unconstrained optimization. The sufficient descent property holds without any line searches. We use a steplength technique that ensures the Zoutendijk condition holds, and the method is proved to be globally convergent. Finally, we improve it and carry out further analysis.
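For context, the sufficient descent property and the Zoutendijk condition mentioned in this abstract are commonly stated in the following standard forms (the exact constants used in that paper may differ): $$g_k^{T} d_k \le -c\,\Vert g_k\Vert^{2}\ \text{for some } c>0 \text{ and all } k, \qquad \sum_{k\ge 0}\frac{\left(g_k^{T} d_k\right)^{2}}{\Vert d_k\Vert^{2}} < \infty,$$ where $$g_k$$ is the gradient and $$d_k$$ the search direction; together they yield $$\liminf_{k\to\infty}\Vert g_k\Vert = 0$$, i.e., global convergence.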


Norm descent conjugate gradient methods for solving symmetric nonlinear equations

The nonlinear conjugate gradient method is very popular for solving large-scale unconstrained minimization problems due to its simple iterative form and low storage requirement. In recent years, it was successfully extended to solve higher-dimensional monotone nonlinear equations. Nevertheless, research activity on conjugate gradient methods for symmetric equations is just beginning. This s...


Extensions of the Hestenes-Stiefel and Polak-Ribiere-Polyak conjugate gradient methods with sufficient descent property

Using search directions of a recent class of three-term conjugate gradient methods, modified versions of the Hestenes-Stiefel and Polak-Ribiere-Polyak methods are proposed which satisfy the sufficient descent condition. The methods are shown to be globally convergent when the line search fulfills the (strong) Wolfe conditions. Numerical experiments are done on a set of CUTEr unconstrained opti...
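For reference, the classical Hestenes-Stiefel and Polak-Ribiere-Polyak conjugate gradient parameters take the standard forms below, with $$y_k = g_{k+1} - g_k$$; the modified three-term directions of that paper add a correction term not reproduced here: $$\beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k}, \qquad \beta_k^{\mathrm{PRP}} = \frac{g_{k+1}^{T} y_k}{\Vert g_k\Vert^{2}}, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k.$$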



Journal

Journal title: Journal of Scientific Computing

Year: 2023

ISSN: 1573-7691, 0885-7474

DOI: https://doi.org/10.1007/s10915-023-02148-y